Geodesic Learning Algorithms Over Flag Manifolds

Authors

  • Yasunori Nishimori
  • Shotaro Akaho
  • Samer Abdallah
  • Mark D. Plumbley
Abstract

Recently, manifold structures have attracted attention in two ways in the machine learning literature. One is the manifold learning problem, that is, learning the intrinsic manifold structure of high-dimensional datasets. The other is the information-geometric approach to learning – exploiting the geometry of the parameter space of learning machines such as neural networks to improve conventional learning algorithms [1]. In this presentation we discuss the use of manifolds for the latter aim; in particular, we investigate the flag manifold arising from one-layer neural networks for solving subspace ICA problems.

When the parameter space of a neural network forms a manifold, the learning equation is defined over that manifold and should be integrated taking the manifold structure into account. The Riemannian learning method [3], by introducing a Riemannian metric on the manifold, yields a discretization scheme for solving differential equations on the manifold: the differential equation is integrated along piecewise geodesics that approximate the original learning trajectory. This Riemannian method was successfully applied to the orthogonal group, i.e. the Lie group of orthogonal matrices, for the non-negative ICA problem [5]. In the context of optimization, this Riemannian approach was also considered over the Stiefel and Grassmann manifolds in [2]. The Grassmann manifold considers only subspaces of a fixed dimension, whereas the flag manifold consists of direct sums of subspaces and includes the Grassmann manifold as a special case. The flag manifold can therefore handle several subspaces simultaneously and is well suited to subspace ICA problems. We extend the formulas obtained in [2] to the flag manifold. The effectiveness of the Riemannian learning method over standard Euclidean algorithms is illustrated in independent subspace analysis (ISA) and complex ICA experiments.

We also discuss the issue of local minima in ISA. We combine the Riemannian learning method over the flag manifold with an MCMC method to overcome the problem of local minima of the ISA cost function.
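The geodesic update described above is easiest to see on the orthogonal group, the special case underlying the flag-manifold algorithms. A minimal sketch, assuming the standard bi-invariant metric: the Euclidean gradient is projected to a skew-symmetric tangent direction and the parameter matrix is moved along the corresponding geodesic via the matrix exponential. The function names (`geodesic_step`) and the learning rate are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import expm

def geodesic_step(W, euclidean_grad, lr=0.1):
    """One geodesic descent step on the orthogonal group O(n).

    W              : current orthogonal matrix (n x n).
    euclidean_grad : Euclidean gradient of the cost at W (n x n).
    lr             : step size along the geodesic (illustrative value).
    """
    # Skew-symmetric tangent direction at the identity; this is the
    # Riemannian (natural) gradient under the bi-invariant metric.
    A = euclidean_grad @ W.T - W @ euclidean_grad.T
    # Follow the geodesic: exp(-lr * A) is orthogonal because A is
    # skew-symmetric, so the update stays exactly on the manifold.
    return expm(-lr * A) @ W
```

Because the update left-multiplies by an orthogonal factor, orthogonality is preserved to machine precision at every iteration, unlike a Euclidean gradient step followed by re-orthogonalization.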


Related articles

A Geometry Preserving Kernel over Riemannian Manifolds

The kernel trick and projection to tangent spaces are two choices for linearizing data points lying on Riemannian manifolds. These approaches provide the prerequisites for applying standard machine learning methods on Riemannian manifolds. Classical kernels implicitly project data to a high-dimensional feature space without considering the intrinsic geometry of the data points. ...


Riemannian Optimization Method on the Flag Manifold for Independent Subspace Analysis

Recent authors have investigated the use of manifolds and Lie group methods for independent component analysis (ICA), including the Stiefel and the Grassmann manifolds and the orthogonal group O(n). In this paper we introduce a new class of manifold, the generalized flag manifold, which is suitable for independent subspace analysis. The generalized flag manifold is a set of subspaces which are ...


Natural Conjugate Gradient on Complex Flag Manifolds for Complex Independent Subspace Analysis

We study the problem of complex-valued independent subspace analysis (ISA). We introduce complex flag manifolds to tackle this problem, and, based on Riemannian geometry, propose the natural conjugate gradient method on this class of manifolds. Numerical experiments demonstrate that the natural conjugate gradient method yields better convergence compared to the natural gradient geodesic search ...


Probabilistic Solutions to Differential Equations and their Application to Riemannian Statistics

We study a probabilistic numerical method for the solution of both boundary and initial value problems that returns a joint Gaussian process posterior over the solution. Such methods have concrete value in the statistics on Riemannian manifolds, where nonanalytic ordinary differential equations are involved in virtually all computations. The probabilistic formulation permits marginalising the u...


On the k-nullity foliations in Finsler geometry

Here, a Finsler manifold $(M,F)$ is considered with corresponding curvature tensor, regarded as $2$-forms on the bundle of non-zero tangent vectors. Certain subspaces of the tangent spaces of $M$ determined by the curvature are introduced and called $k$-nullity foliations of the curvature operator. It is shown that if the dimension of foliation is constant, then the distribution is involutive...




Publication year: 2007